Whither Music IR Evaluation Infrastructure: Lessons to be Learned from TREC

Author

  • Ellen M. Voorhees
Abstract

Benchmark tasks are a powerful method for advancing the state of the art in a field. The music information retrieval community recently acknowledged the utility of such tasks by resolving to create an evaluation framework for music information retrieval (MIR) tasks and music digital libraries (MDL). This paper describes the processes used in the Text REtrieval Conference (TREC) evaluations to create information retrieval evaluation infrastructure, reviews assessments of how appropriate the evaluation methodology is for TREC tasks, and makes suggestions regarding the development of an MIR/MDL evaluation framework based on TREC experience.


Similar Resources

The Evaluation of Question Answering Systems: Lessons Learned from the TREC QA Track

The TREC question answering (QA) track was the first large-scale evaluation of open-domain question answering systems. In addition to successfully fostering research on the QA task, the track has also been used to investigate appropriate evaluation methodologies for question answering systems. This paper gives a brief history of the TREC QA track, motivating the decisions made in its implementa...


Some Lessons Learned To Date from the TREC Legal Track ( 2006 - 2009 )

For four years now, the Text REtrieval Conference (TREC) Legal Track, administered by the US National Institute of Standards and Technology (NIST), has undertaken yearly studies evaluating the application of Information Retrieval (IR) methods to e-discovery in the context of U.S. civil litigation. In this short paper, we distill some of what has been learned. As we write this, analysis is not yet complete for some...


Lessons Learned from the CHiC and SBS Interactive Tracks: A Wishlist for Interactive IR Evaluation

Over the course of the past two decades, the Interactive Tracks at TREC and INEX have contributed greatly to our knowledge of how to run an interactive IR evaluation campaign. In this position paper, we add to this body of knowledge by taking stock of our own experiences and challenges in organizing the CHiC and SBS Interactive Tracks from 2013 to 2016 in the form of a list of important propert...


Analysis of Biomedical and Health Queries: Lessons Learned from TREC and CLEF Evaluation Benchmarks

A large body of research has examined, from both the query side and the user-behaviour side, the characteristics of medical and health-related searches. One of the core issues in medical information retrieval is the diversity of tasks, which leads to a diversity of categories of information needs and queries. From the evaluation perspective, another related challenging issue is the limited availability...


Indri TREC Notebook 2006: Lessons Learned From Three Terabyte Tracks

This report describes the lessons learned using the Indri search system during the 2004-2006 TREC Terabyte Tracks. We provide an overview of Indri, and, for the ad hoc and named page finding tasks, discuss our general approach to the problem, what worked, what did not work, and what could possibly work in the future.



Journal:

Volume   Issue 

Pages  -

Publication date 2002